AIbase

# Large-scale Distillation

## Lamini GPT 774M

A 774M-parameter language model from MBZUAI, based on the gpt2-large architecture and fine-tuned on 2.58 million instruction-tuning samples. It is suited to natural-language instruction-response tasks.

Tags: Large Language Model, Transformers, English
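As a rough illustration of how an instruction-tuned model like this might be used, the sketch below wraps a plain instruction in an instruction-tuning prompt template and queries the model via the Hugging Face transformers library. The model id `MBZUAI/LaMini-GPT-774M` and the exact wording of the template are assumptions based on the LaMini model family, not details stated on this page.

```python
# Hypothetical sketch: querying an instruction-tuned GPT-2-style model
# with Hugging Face transformers. Model id and prompt template are
# assumptions, not confirmed by this listing.

PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)


def build_prompt(instruction: str) -> str:
    """Wrap a plain instruction in the assumed instruction-tuning template."""
    return PROMPT_TEMPLATE.format(instruction=instruction.strip())


if __name__ == "__main__":
    # Requires `pip install transformers torch`; downloads the model weights.
    from transformers import pipeline

    generator = pipeline("text-generation", model="MBZUAI/LaMini-GPT-774M")
    prompt = build_prompt("List three uses of model distillation.")
    print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```

Keeping the prompt format identical to the one used during fine-tuning generally matters for instruction-tuned models, since they learn to respond to the template rather than to raw text.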